An Alternative View of Variational Bayes and Minimum Variational Stochastic Complexity

Author

  • Kazuho Watanabe
Abstract

Bayesian learning is widely used in many applied data-modelling problems and is often accompanied by approximation schemes, since it requires intractable computation of posterior distributions. In this study, we focus on two approximation methods: the variational Bayes and the local variational approximation. We show that the variational Bayes approach for statistical models with latent variables can be viewed as a special case of the local variational approximation, where the log-sum-exp function is used to form the lower bound of the log-likelihood. The minimum variational stochastic complexity, which is the objective function of the variational Bayes, is also examined and related to the asymptotic theory of Bayesian learning.
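To make the claimed connection concrete, here is the bound in question written out in standard notation (a minimal sketch; the symbols x, z, and q are notational assumptions of this summary, not quoted from the paper). For a model with a discrete latent variable z, the log-likelihood is a log-sum-exp of the complete-data log-likelihood, and bounding this convex function from below gives the familiar variational lower bound:

\log p(x) = \log \sum_{z} \exp\bigl( \log p(x, z) \bigr)
          \ge \sum_{z} q(z) \log p(x, z) + H(q)
          = \mathbb{E}_{q}\!\left[ \log \frac{p(x, z)}{q(z)} \right],

where q(z) is an arbitrary distribution over z and H(q) = -\sum_{z} q(z) \log q(z) is its entropy. Equality holds at q(z) = p(z \mid x), which is how the variational Bayes objective arises as a local variational (Legendre-dual) bound on the log-sum-exp function.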


Similar Articles

Minimum Variational Stochastic Complexity and Average Generalization Error in Latent Variable Models

Bayesian learning is often accomplished with approximation schemes because it requires intractable computation of the posterior distributions. In this paper, focusing on the variational Bayes method, an approximation scheme, we investigate the relationship between the asymptotic behavior of the variational stochastic complexity or free energy, which is the objective function to be minimized by variati...
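As a reference point for the quantities named in this snippet (standard definitions, assumed rather than quoted from the paper): for data X^n with parameter w and latent variables z, the variational stochastic complexity, or variational free energy, is the minimum of the free-energy functional over factorized distributions, and it upper-bounds the Bayes stochastic complexity:

\bar{F}(X^n) = \min_{q(z)\, q(w)} \mathbb{E}_{q(z) q(w)}\!\left[ \log \frac{q(z)\, q(w)}{p(X^n, z, w)} \right] \ \ge\ -\log p(X^n),

where -\log p(X^n) is the Bayes stochastic complexity. Asymptotic analyses in this line of work typically characterize the leading behavior of the expected value of such quantities as a model-dependent coefficient times \log n, which is what links the minimized objective to the average generalization error named in the title.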


On Variational Bayes Algorithms for Exponential Family Mixtures

In this paper, we empirically analyze the behavior of the Variational Bayes algorithm for the mixture model. While Variational Bayesian learning has provided computational tractability and good generalization performance in many applications, little has been done to investigate its properties. Recently, the stochastic complexity of mixture models in Variational Bayesian learning was cl...


A Filtering Approach to Stochastic Variational Inference

Stochastic variational inference (SVI) uses stochastic optimization to scale up Bayesian computation to massive data. We present an alternative perspective on SVI as approximate parallel coordinate ascent. SVI trades off bias and variance to step close to the unknown true coordinate optimum given by batch variational Bayes (VB). We define a model to automate this process; a minimal sketch of the basic SVI update it builds on is given below. The model infers the l...
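For orientation, here is a minimal runnable sketch of the kind of SVI update discussed in that snippet, on a hypothetical conjugate toy model (Gaussian mean with known unit variance). The model, step-size schedule, and all variable names are assumptions of this illustration, not taken from the paper:

import numpy as np

# Stochastic variational inference (SVI) sketch on a toy conjugate model:
# data x_i ~ N(mu, 1), prior mu ~ N(0, 1); q(mu) = N(m, v) is tracked
# through its natural parameters (eta1, eta2) = (m/v, -1/(2v)).
rng = np.random.default_rng(0)
N = 1000
x = rng.normal(2.0, 1.0, size=N)            # synthetic data, true mean 2

prior = np.array([0.0, -0.5])               # natural parameters of N(0, 1)
lam = prior.copy()                          # global variational parameters

for t in range(1, 5001):
    i = rng.integers(N)                     # sample one data point
    # Intermediate estimate: treat the dataset as N replicas of x[i].
    lam_hat = prior + N * np.array([x[i], -0.5])
    rho = (t + 10.0) ** -0.7                # Robbins-Monro step size
    lam = (1 - rho) * lam + rho * lam_hat   # noisy natural-gradient step

v = -1.0 / (2.0 * lam[1])                   # back to mean/variance form
m = lam[0] * v
print(f"SVI:   m = {m:.3f}, v = {v:.6f}")
print(f"Exact: m = {x.sum() / (N + 1):.3f}, v = {1.0 / (N + 1):.6f}")

The hand-picked (t + 10)^{-0.7} schedule here embodies exactly the bias-variance trade-off that the snippet above says the proposed model is designed to automate.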


Alpha-Divergences in Variational Dropout

We investigate the use of alternative divergences to Kullback-Leibler (KL) in variational inference (VI), based on the Variational Dropout [10]. Stochastic gradient variational Bayes (SGVB) [9] is a general framework for estimating the evidence lower bound (ELBO) in Variational Bayes. In this work, we extend the SGVB estimator using Alpha-Divergences, which are alternative divergences to...
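For context, one standard way alpha-divergences replace KL in such estimators is through the variational Rényi bound (Li and Turner, 2016); whether this exact objective is the one extended in the paper above cannot be confirmed from the truncated snippet:

\mathcal{L}_{\alpha}(q) = \frac{1}{1 - \alpha} \log \mathbb{E}_{q(z)}\!\left[ \left( \frac{p(x, z)}{q(z)} \right)^{1 - \alpha} \right],

which recovers the standard ELBO in the limit \alpha \to 1 and the exact log marginal likelihood \log p(x) at \alpha = 0, so \alpha interpolates between the two objectives.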


Scalable Inference Algorithms for Clustering Large Networks

Clustering is an important task in network analysis, with applications in fields such as biology and the social sciences. We present a novel inference algorithm for the Stochastic Block Model (SBM), a well-known network clustering model. Previous inference for this model has typically used Markov Chain Monte Carlo or Variational Bayes, but our method is the first to utilize Stochastic Variationa...



Journal:

Volume   Issue

Pages  -

Publication date: 2010